Generative Knowledge Transfer for Neural Language Models

Authors

  • Sungho Shin
  • Kyuyeon Hwang
  • Wonyong Sung
Abstract

In this paper, we propose a generative knowledge transfer technique that trains an RNN-based language model (the student network) using text and output probabilities generated by a previously trained RNN (the teacher network). The text generation can be conducted by either the teacher or the student network. Performance can be further improved by taking the ensemble of soft labels obtained from multiple teacher networks. This method can be used for privacy-conscious language model adaptation because no user data is directly used for training. In particular, when the soft labels of multiple devices are aggregated via a trusted third party, very strong privacy protection can be expected.
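To make the procedure concrete, below is a minimal PyTorch sketch of the transfer loop, assuming small LSTM language models; the vocabulary, model sizes, optimizer settings, and the generate helper are illustrative placeholders rather than the paper's exact configuration.

```python
# Minimal sketch of generative knowledge transfer: the teacher samples
# text, and the student is trained to match the teacher's soft labels
# on that text. Sizes and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMBED, HIDDEN, SEQ_LEN = 100, 32, 64, 20

class LstmLM(nn.Module):
    def __init__(self, hidden):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMBED)
        self.lstm = nn.LSTM(EMBED, hidden, batch_first=True)
        self.out = nn.Linear(hidden, VOCAB)

    def forward(self, tokens, state=None):
        h, state = self.lstm(self.embed(tokens), state)
        return self.out(h), state  # next-token logits per position

@torch.no_grad()
def generate(model, batch=8, length=SEQ_LEN):
    # Autoregressive sampling; either network can play this role.
    tokens = torch.zeros(batch, 1, dtype=torch.long)  # start token 0
    state, seq = None, []
    for _ in range(length):
        logits, state = model(tokens, state)
        tokens = torch.multinomial(F.softmax(logits[:, -1], dim=-1), 1)
        seq.append(tokens)
    return torch.cat(seq, dim=1)

teacher, student = LstmLM(HIDDEN), LstmLM(HIDDEN // 2)
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for step in range(100):
    text = generate(teacher)                     # generated training text
    with torch.no_grad():
        soft = F.softmax(teacher(text)[0], -1)   # teacher soft labels
    log_p = F.log_softmax(student(text)[0], -1)
    loss = F.kl_div(log_p, soft, reduction="batchmean")  # match teacher
    opt.zero_grad(); loss.backward(); opt.step()
```

An ensemble of teachers, as mentioned above, would amount to averaging the soft tensors from several trained models before the KL step, and running the same generate helper on the student instead realizes student-driven text generation.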

Similar resources

A Generative Analysis of the Acquisition of Negation by Iranian EFL Learners: A Typological Study

The present study was an attempt to investigate the acquisition of negation properties by Persian monolingual and Kurdish-Persian bilingual learners of English across different levels of language proficiency and within a generative framework. Generative models are generally concerned with issues such as universal grammar (UG), language transfer, and morphological variability in nonprimary language d...


Learning Deep Generative Models of Graphs

Graphs are fundamental data structures which concisely capture the relational structure in many important real-world domains, such as knowledge graphs, physical and social interactions, language, and chemistry. Here we introduce a powerful new approach for learning generative models over graphs, which can capture both their structure and attributes. Our approach uses graph neural networks to ex...
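For intuition, here is a toy sketch of the sequential generation process such models use: at each step, decide whether to add a node and which existing nodes to connect it to. A single mean-style message-passing layer stands in for a full graph neural network, and all sizes and probabilities come from an untrained model; this is an illustrative reading of the approach, not the paper's implementation.

```python
# Toy sketch of sequential graph generation: "add node?" and
# "add edge to node v?" decisions scored from node embeddings.
import torch
import torch.nn as nn

H = 16

class GraphGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.init_node = nn.Parameter(torch.randn(H))
        self.msg = nn.Linear(2 * H, H)       # one message-passing layer
        self.add_node = nn.Linear(H, 1)      # P(add another node)
        self.add_edge = nn.Linear(2 * H, 1)  # P(edge new node -> node v)

    def propagate(self, nodes, edges):
        # One round of neighbour aggregation over the current edges.
        out = nodes.clone()
        for u, v in edges:
            out[u] = out[u] + torch.tanh(self.msg(torch.cat([nodes[u], nodes[v]])))
        return out

    @torch.no_grad()
    def sample(self, max_nodes=6):
        nodes, edges = [self.init_node], []
        while len(nodes) < max_nodes:
            h = self.propagate(torch.stack(nodes), edges)
            if torch.sigmoid(self.add_node(h.mean(0))) < torch.rand(1):
                break                              # stop growing the graph
            new = self.init_node + 0.1 * torch.randn(H)
            for v in range(len(nodes)):            # connect to existing nodes?
                p = torch.sigmoid(self.add_edge(torch.cat([new, h[v]])))
                if torch.rand(1) < p:
                    edges.append((len(nodes), v))
            nodes.append(new)
        return len(nodes), edges

g = GraphGenerator()
n, e = g.sample()
print(f"sampled graph with {n} nodes and edges {e}")
```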


Text Generation using Generative Adversarial Training

Generative models reduce the need for laboriously labeled datasets. Text generation techniques can be applied to improve language models, machine translation, summarization, and captioning. This project experiments with different recurrent neural network models to build generative adversarial networks for generating text from noise. The trained generator is capable of producing...
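As a rough illustration of adversarial text generation, the sketch below pairs an RNN generator that maps noise to soft token distributions with an RNN discriminator; feeding soft distributions keeps the generator differentiable, sidestepping the discrete-sampling problem that real systems handle with policy gradients or similar tricks. All names, sizes, and the random stand-in for "real" data are illustrative, not the project's actual design.

```python
# Toy GAN for text: RNN generator (noise -> soft token sequence)
# versus RNN discriminator (sequence -> real/fake logit).
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, H, SEQ, BATCH = 50, 32, 10, 16

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(VOCAB, H, batch_first=True)
        self.proj = nn.Linear(H, VOCAB)
        self.from_noise = nn.Linear(H, H)

    def forward(self, z):                       # z: (batch, H) noise
        h = torch.tanh(self.from_noise(z)).unsqueeze(0)
        tok = torch.zeros(z.size(0), 1, VOCAB)  # start "token"
        seq = []
        for _ in range(SEQ):
            out, h = self.rnn(tok, h)
            tok = F.softmax(self.proj(out), dim=-1)  # soft token
            seq.append(tok)
        return torch.cat(seq, dim=1)            # (batch, SEQ, VOCAB)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(VOCAB, H, batch_first=True)
        self.out = nn.Linear(H, 1)

    def forward(self, x):
        _, h = self.rnn(x)
        return self.out(h[-1])                  # real/fake logit

G, D = Generator(), Discriminator()
g_opt = torch.optim.Adam(G.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(50):
    real = F.one_hot(torch.randint(VOCAB, (BATCH, SEQ)), VOCAB).float()
    fake = G(torch.randn(BATCH, H))
    # Discriminator update: real -> 1, fake -> 0.
    d_loss = bce(D(real), torch.ones(BATCH, 1)) \
           + bce(D(fake.detach()), torch.zeros(BATCH, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()
    # Generator update: try to fool the discriminator.
    g_loss = bce(D(fake), torch.ones(BATCH, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```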


Generative Incremental Dependency Parsing with Neural Networks

We propose a neural network model for scalable generative transition-based dependency parsing. A probability distribution over both sentences and transition sequences is parameterised by a feedforward neural network. The model surpasses the accuracy and speed of previous generative dependency parsers, reaching 91.1% UAS. Perplexity results show a strong improvement over n-gram language models, ...
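To illustrate the kind of model described, here is a minimal PyTorch sketch of one generative parsing step: a feedforward network scores the next transition and, for SHIFT, also generates the next word, so the model defines a joint distribution over sentences and transition sequences. The feature set, transition inventory, and all sizes are simplified placeholders, not the paper's architecture.

```python
# One step of a generative transition-based parser: a feedforward net
# maps parser-state features to P(transition) and P(word | SHIFT).
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, FEAT, H = 100, 3, 64   # features: e.g. top-2 of stack + last word
TRANSITIONS = ["SHIFT", "LEFT_ARC", "RIGHT_ARC"]

class GenerativeParser(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB + 1, 32, padding_idx=VOCAB)
        self.hidden = nn.Linear(FEAT * 32, H)
        self.trans = nn.Linear(H, len(TRANSITIONS))  # which transition
        self.word = nn.Linear(H, VOCAB)              # word emitted by SHIFT

    def forward(self, feats):                        # feats: (batch, FEAT)
        h = torch.relu(self.hidden(self.embed(feats).flatten(1)))
        return F.log_softmax(self.trans(h), -1), F.log_softmax(self.word(h), -1)

model = GenerativeParser()
# Joint log-probability of one step: log P(SHIFT) + log P(word | SHIFT).
feats = torch.tensor([[VOCAB, VOCAB, VOCAB]])  # empty state, padding ids
log_trans, log_word = model(feats)
shift_word = 7                                 # hypothetical next word id
logp = log_trans[0, TRANSITIONS.index("SHIFT")] + log_word[0, shift_word]
print(f"log P(SHIFT, word={shift_word}) = {logp.item():.3f}")
```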


Generative Deep Neural Networks for Dialogue: A Short Review

Researchers have recently started investigating deep neural networks for dialogue applications. In particular, generative sequence-to-sequence (Seq2Seq) models have shown promising results for unstructured tasks, such as word-level dialogue response generation. The hope is that such models will be able to leverage massive amounts of data to learn meaningful natural language representations and ...
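As a concrete illustration of the Seq2Seq setup described above, here is a minimal PyTorch sketch of a generative encoder-decoder dialogue model; the vocabulary, layer sizes, BOS token id, and greedy decoding loop are illustrative assumptions, not any specific published system.

```python
# Minimal Seq2Seq dialogue sketch: a GRU encoder reads the dialogue
# context and a GRU decoder generates the response token by token.
import torch
import torch.nn as nn

VOCAB, EMB, H, MAX_LEN = 200, 32, 64, 12
BOS = 0  # hypothetical beginning-of-sequence id

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.GRU(EMB, H, batch_first=True)
        self.decoder = nn.GRU(EMB, H, batch_first=True)
        self.out = nn.Linear(H, VOCAB)

    def forward(self, context, response):
        # Teacher forcing: condition the decoder on the gold response.
        _, h = self.encoder(self.embed(context))
        dec, _ = self.decoder(self.embed(response), h)
        return self.out(dec)               # next-token logits

    @torch.no_grad()
    def reply(self, context):
        # Greedy decoding from the encoded dialogue context.
        _, h = self.encoder(self.embed(context))
        tok = torch.full((context.size(0), 1), BOS, dtype=torch.long)
        words = []
        for _ in range(MAX_LEN):
            dec, h = self.decoder(self.embed(tok), h)
            tok = self.out(dec)[:, -1].argmax(-1, keepdim=True)
            words.append(tok)
        return torch.cat(words, dim=1)

model = Seq2Seq()
context = torch.randint(VOCAB, (1, 8))     # toy encoded dialogue history
print(model.reply(context))                # greedy response token ids
```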




Publication date: 2016